
Ensure mysql_to_gcs fully compatible with MySQL and BigQuery for datetime-related values #15026

Merged
merged 1 commit into apache:main on Jun 13, 2021

Conversation


@tianjianjiang tianjianjiang commented Mar 26, 2021

Compatibility issues for datetime-related values:

  • Valid data:

    • BigQuery's range starts from year 0001.
      • TIME range: 00:00:00 to 23:59:59.99999 (but the canonical format says "[.DDDDDD]: Up to six fractional digits")
    • MySQL's range starts from year 1000 (except for the error value in year 0000, which is already addressed);
      • TIME range: -838:59:59.000000 to 838:59:59.000000
        • Shall I raise an exception when timedelta.total_seconds() > 86399.99999(9)? (See the conversion sketch after this description.)
          This PR implicitly raises "OverflowError: date value out of range" when timedelta.total_seconds() < 0.
  • Supported output format:

When no schema is provided, the current version of the operator assumes that MySQL's DATETIME and DATE values become BigQuery's TIMESTAMP, which doesn't accept a Unix epoch integer unless the downstream gcs_to_bigquery actually takes the output of _write_local_schema_file().

When a schema is provided, the current version of the operator sends integers and floats to BigQuery's DATETIME/TIMESTAMP and TIME, respectively. The issues are:

  • For DATETIME, BigQuery will throw errors;
  • For TIMESTAMP, although BigQuery accepts a Unix epoch integer in this case, calendar.timegm() will still cause trouble (unless the downstream gcs_to_bigquery uses a connection that switches to BigQuery's legacy SQL);
    >>> calendar.timegm((1000,1,1,0,0,0))
    -30610224000
  • For TIME, BigQuery may silently convert to incorrect values.

As for DATE, if the schema is indeed sent to the downstream gcs_to_bigquery, then the current version will work. Otherwise, it will still hit BigQuery's rejection of integers for TIMESTAMP, and negative integers from calendar.timegm() for edge cases.
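
For illustration, here is a minimal sketch of the direction described above (it is not the provider's actual implementation; the helper name convert_value and the exact string formats accepted by BigQuery's loader are my assumptions): emit strings instead of epoch integers for DATETIME/TIMESTAMP/DATE, and reject TIME values outside BigQuery's 00:00:00 to 23:59:59.999999 range.

    # Minimal sketch, not the provider's actual code: turn MySQL datetime-related
    # Python values into strings for BigQuery's standard SQL types.
    from datetime import date, datetime, timedelta

    def convert_value(value):
        if isinstance(value, datetime):
            # ISO-style string instead of calendar.timegm(), so years before 1970
            # (e.g. 1000-01-01) do not become negative epoch integers.
            return value.isoformat(sep=" ")
        if isinstance(value, date):
            return value.isoformat()
        if isinstance(value, timedelta):
            # MySQL TIME comes back as a timedelta and may fall outside BigQuery's
            # 00:00:00 to 23:59:59.999999 TIME range; reject anything outside it.
            total = value.total_seconds()
            if total < 0 or total >= 86400:
                raise ValueError(f"TIME value out of BigQuery's range: {value!r}")
            return (datetime.min + value).time().isoformat()
        return value

    >>> convert_value(datetime(1000, 1, 1))
    '1000-01-01 00:00:00'
    >>> convert_value(timedelta(hours=23, minutes=59, seconds=59, microseconds=999999))
    '23:59:59.999999'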

A side note:
I suppose some usages might have been designed for BigQuery's legacy SQL dialect. However, since gcs_to_bigquery doesn't explicitly support it (only one line detects it, for escaped_table_name), and the standard SQL dialect is the default for the Python library, it might be safe to follow the standard SQL dialect's requirements.
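
To back that up, a small sketch (assuming the google-cloud-bigquery client library and application-default credentials; the query itself is only an example) shows that the Python client runs standard SQL unless legacy SQL is explicitly requested:

    # Small illustration: the google-cloud-bigquery client defaults to standard SQL;
    # legacy SQL has to be opted into with use_legacy_sql=True.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes application-default credentials
    job_config = bigquery.QueryJobConfig(use_legacy_sql=False)  # False is already the default
    rows = client.query("SELECT DATETIME '1000-01-01 00:00:00' AS dt", job_config=job_config).result()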



boring-cyborg bot added the area:providers and provider:google (Google, including GCP, related issues) labels on Mar 26, 2021

boring-cyborg bot commented Mar 26, 2021

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (flake8, pylint and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker image, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: dev@airflow.apache.org
    Slack: https://s.apache.org/airflow-slack

@tianjianjiang changed the title from "fix: ensure datetime-related values fully compatible with MySQL and BigQuery" to "Ensure mysql_to_gcs fully compatible with MySQL and BigQuery for datetime-related values" on Mar 26, 2021

potiuk commented Jun 13, 2021

Nice one!

github-actions bot added the okay to merge label (it's OK to merge this PR as it does not require more tests) on Jun 13, 2021
@github-actions

The PR is likely OK to be merged with just a subset of tests for the default Python and database versions, without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full tests matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest main or amend the last commit of the PR, and push it with --force-with-lease.

@potiuk potiuk merged commit b272f9c into apache:main Jun 13, 2021

boring-cyborg bot commented Jun 13, 2021

Awesome work, congrats on your first merged pull request!
